Some properties of Rényi entropy over countably infinite alphabets
Authors
Abstract
In this paper we study certain properties of the Rényi entropy functions Hα(P) on the space of discrete probability distributions with infinitely many probability masses. We prove some properties that parallel those known in the finite case. Other properties, however, are quite different in the infinite case, for example the (dis)continuity of Hα(P) in P and the possibility of divergence, together with the behaviour of Hα(P) at the point of divergence. Finally, we prove that, given a sequence of distributions Pn converging to P with respect to the total variation distance, limα→1+ limn→∞ Hα(Pn) is in general not equal to limn→∞ limα→1+ Hα(Pn); interchanging these limiting operations, which is often done in applications, is therefore not justified in this case.
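As a quick numerical illustration of the divergence phenomenon mentioned in the abstract (not taken from the paper itself): for a power-law distribution p_i ∝ i^(−2) on a countably infinite alphabet, the sum Σ p_i^α converges only for α > 1/2, so Hα(P) is finite for α > 1/2 and diverges below that threshold. The sketch below evaluates Hα on a finite truncation of this distribution; the truncation level N and the exponent 2 are illustrative choices.

```python
import math

def renyi_entropy(probs, alpha):
    """Renyi entropy H_alpha(P) = log(sum p_i^alpha) / (1 - alpha), in nats.

    At alpha = 1 we return the Shannon entropy, which is the alpha -> 1 limit.
    """
    if abs(alpha - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)

# Finite truncation of the infinite-alphabet distribution p_i proportional
# to 1/i^2.  For the true infinite distribution, H_alpha diverges whenever
# alpha <= 1/2; on the truncation it is finite but grows with N.
N = 100_000
weights = [1.0 / (i * i) for i in range(1, N + 1)]
Z = sum(weights)
p = [w / Z for w in weights]

# H_alpha is non-increasing in alpha, and values blow up as alpha drops
# toward 1/2 and N grows.
for a in (0.6, 0.9, 1.0, 1.5, 2.0):
    print(f"alpha = {a}: H_alpha = {renyi_entropy(p, a):.4f}")
```

Increasing N leaves Hα essentially unchanged for α well above 1/2 but keeps pushing it upward for α near and below 1/2, which is the finite shadow of the divergence discussed in the paper.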
Similar resources
A Preferred Definition of Conditional Rényi Entropy
The Rényi entropy is a generalization of Shannon entropy to a one-parameter family of entropies. Tsallis entropy, too, is a generalization of Shannon entropy; its measure is non-logarithmic. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for Tsallis entropy, the conditional entropy was introduced a...
On a Class of P Automata as a Machine Model for Languages over Infinite Alphabets
We show how P automata having a finite description and working with a finite object-alphabet can be used to describe languages over countably infinite alphabets. We propose to relate the language classes characterized by different types of P automata to some of the existing characterizations of language classes over infinite alphabets, and give an upper bound for the class of languages accepted...
Generalized Fano-Type Inequality for Countably Infinite Systems with List-Decoding
This study investigates generalized Fano-type inequalities in the following senses: (i) the alphabet X of a random variable X is countably infinite; (ii) instead of a fixed finite cardinality of X, a fixed X-marginal distribution PX is given; (iii) information measures are generalized from the conditional Shannon entropy H(X | Y) to a general type of conditional information measures without ex...
Convexity/concavity of Rényi entropy and α-mutual information
Entropy is well known to be Schur-concave on finite alphabets. Recently, the authors have strengthened this result by showing that, for any pair of probability distributions P and Q with Q majorized by P, the entropy of Q exceeds the entropy of P by at least the relative entropy D(P||Q). This result applies to P and Q defined on countable alphabets. This paper shows the counterpart of t...
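Reading the strengthened Schur-concavity bound quoted above as the inequality H(Q) − H(P) ≥ D(P||Q) whenever P majorizes Q, it can be checked numerically; the distributions P and Q below are illustrative choices, not taken from the cited paper.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def relative_entropy(p, q):
    """Relative entropy D(p||q) in nats (assumes q > 0 wherever p > 0)."""
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

def majorizes(p, q):
    """True if p majorizes q: partial sums of p, sorted in decreasing
    order, dominate the corresponding partial sums of q."""
    ps, qs = sorted(p, reverse=True), sorted(q, reverse=True)
    sp = sq = 0.0
    for a, b in zip(ps, qs):
        sp += a
        sq += b
        if sp < sq - 1e-12:
            return False
    return True

P = [0.7, 0.2, 0.1]    # more concentrated: P majorizes Q
Q = [0.4, 0.35, 0.25]  # more spread out: larger entropy

assert majorizes(P, Q)
gap = shannon_entropy(Q) - shannon_entropy(P)
print(f"H(Q) - H(P) = {gap:.4f},  D(P||Q) = {relative_entropy(P, Q):.4f}")
```

For these distributions the entropy gap is about 0.27 nats while D(P||Q) is about 0.19 nats, consistent with the bound.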
Universal Weak Variable-Length Source Coding on Countable Infinite Alphabets
Motivated by the fact that universal source coding on countably infinite alphabets is not feasible, this work introduces the notion of "almost lossless source coding". Analogous to the weak variable-length source coding problem studied by Han [3], almost lossless source coding aims at relaxing the lossless block-wise assumption to allow an average per-letter distortion that vanishes asymptotical...
Journal:
- Probl. Inf. Transm.
Volume 49, Issue
Pages -
Published 2013